Efficient approximation of probability distributions with k-order decomposable models
Abstract
Over the last decades, several learning algorithms have been proposed to learn probability distributions based on decomposable models. Some of these algorithms can be used to search for a maximum likelihood decomposable model with a given maximum clique size, k. Unfortunately, the problem of learning a maximum likelihood decomposable model for a given maximum clique size is NP-hard for k > 2. In this work, we propose the fractal tree family of algorithms, which approximates this problem with a computational complexity of O(k · n · N) in the worst case, where n is the number of involved random variables and N is the size of the training set. The fractal tree algorithms construct a sequence of maximal i-order decomposable graphs, for i = 2, ..., k, in k − 1 steps. At each step, the algorithms follow a divide-and-conquer strategy that decomposes the problem into a set of separator problems, each of which is solved efficiently using the generalized Chow-Liu algorithm. Fractal trees can be considered a natural extension of the Chow-Liu algorithm from k = 2 to arbitrary values of k, and they have shown competitive behavior on the maximum likelihood problem. Due to this competitive behavior, their low computational complexity, and their modularity, which allows different parallelization strategies to be implemented, the proposed procedures are especially advisable for modeling high-dimensional domains.
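Since the fractal tree procedures reduce each separator problem to a Chow-Liu-style tree problem, the k = 2 base case is a useful reference point. The following is a minimal sketch of that base case, the classical Chow-Liu maximum-likelihood tree over pairwise empirical mutual information; it is not the authors' implementation, and the function names and toy data are illustrative assumptions.

```python
# Minimal sketch of the k = 2 base case (classical Chow-Liu algorithm):
# build a maximum-weight spanning tree over pairwise empirical mutual
# information estimated from a discrete data set.

from collections import Counter
from itertools import combinations
from math import log


def mutual_information(data, i, j):
    """Empirical mutual information I(X_i; X_j) estimated from samples."""
    n = len(data)
    pij = Counter((row[i], row[j]) for row in data)
    pi = Counter(row[i] for row in data)
    pj = Counter(row[j] for row in data)
    mi = 0.0
    for (xi, xj), c in pij.items():
        p_joint = c / n
        mi += p_joint * log(p_joint / ((pi[xi] / n) * (pj[xj] / n)))
    return mi


def chow_liu_tree(data, n_vars):
    """Return the edges of a maximum-likelihood tree (2-order decomposable
    model) via Kruskal's algorithm on mutual-information weights."""
    weights = {(i, j): mutual_information(data, i, j)
               for i, j in combinations(range(n_vars), 2)}
    parent = list(range(n_vars))

    def find(x):  # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for (i, j), w in sorted(weights.items(), key=lambda e: -e[1]):
        ri, rj = find(i), find(j)
        if ri != rj:  # adding the edge keeps the graph acyclic
            parent[ri] = rj
            tree.append((i, j, w))
    return tree


if __name__ == "__main__":
    # Toy data set: 4 binary variables, with X1 tracking X0 and X3 tracking X2.
    data = [(0, 0, 1, 1), (1, 1, 0, 0), (0, 0, 0, 0),
            (1, 1, 1, 1), (0, 0, 1, 1), (1, 1, 0, 0)]
    print(chow_liu_tree(data, n_vars=4))
```

The fractal tree algorithms described in the abstract repeat this kind of maximum-weight tree construction inside each separator subproblem when growing the i-order graphs from i = 2 up to k.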
Similar works
Approximation of Data by Decomposable Belief Models
It is well known that among all probabilistic graphical Markov models the class of decomposable models is the most advantageous in the sense that the respective distributions can be expressed with the help of their marginals and that the most efficient computational procedures are designed for their processing (for example professional software does not perform computations with Bayesian networ...
Approximating discrete probability distributions with decomposable models
A heuristic procedure is presented to approximate an n-dimensional discrete probability distribution with a decomposable model of a given complexity. It is shown that, without loss of generality, the search space can be restricted to a suitable subclass of decomposable models, whose members are called elementary models. The selected elementary model is constructed in an incremental manner accor...
A continuous approximation fitting to the discrete distributions using ODE
Fitting probability density functions to discrete probability functions has always been needed and is very important. This paper fits continuous curves, which are probability density functions, to the binomial, negative binomial, geometric, Poisson and hypergeometric probability functions. The main key in these fittings is the use of the derivative concept and common differential ...
Spatial Latent Gaussian Models: Application to House Prices Data in Tehran City
Latent Gaussian models are flexible models that are applied in several statistical applications. When posterior marginals or full conditional distributions in hierarchical Bayesian inference from these models are not available in closed form, Markov chain Monte Carlo methods are implemented. The component dependence of the latent field usually causes an increase in computational time and divergenc...
An Approximate Formulation based on Integer Linear Programming for Learning Maximum Weighted (k+1)-order Decomposable Graphs
In this work we deal with the problem of learning a maximum weighted (k + 1)-order decomposable graph coarser than a given maximal k-order decomposable graph. An Integer Linear Programming formulation for the problem has been recently proposed and used in order to solve instances of the problem with a moderate number of vertices [8]. However, as the problem is known to be NP-hard, it is of pract...
Journal: Int. J. Approx. Reasoning
Volume: 74, Issue: -
Pages: -
Publication year: 2016